Inferential Privacy Guarantees for Differentially Private Mechanisms

Authors

  • Arpita Ghosh
  • Robert D. Kleinberg
Abstract

The correlations and network structure amongst individuals in datasets today—whether explicitly articulated, or deduced from biological or behavioral connections—pose new issues around privacy guarantees, because of inferences that can be made about one individual from another’s data. This motivates quantifying privacy in networked contexts in terms of ‘inferential privacy’—which measures the change in beliefs about an individual’s data from the result of a computation—as originally proposed by Dalenius in the 1970s. Inferential privacy is implied by differential privacy when data are independent, but can be much worse when data are correlated; indeed, simple examples, as well as a general impossibility theorem of Dwork and Naor, preclude the possibility of achieving non-trivial inferential privacy when the adversary can have arbitrary auxiliary information. In this paper, we ask how differential privacy guarantees translate to guarantees on inferential privacy in networked contexts: specifically, under what limitations on the adversary’s information about correlations, modeled as a prior distribution over datasets, can we deduce an inferential guarantee from a differential one? We prove two main results. The first result pertains to distributions that satisfy a natural positive-affiliation condition, and gives an upper bound on the inferential privacy guarantee for any differentially private mechanism. This upper bound is matched by a simple mechanism that adds Laplace noise to the sum of the data. The second result pertains to distributions that have weak correlations, defined in terms of a suitable “influence matrix”. The result provides an upper bound for inferential privacy in terms of the differential privacy parameter and the spectral norm of this matrix.

∗ Cornell University, Ithaca, NY, USA. [email protected]
† Cornell University, Ithaca, NY, USA, and Microsoft Research New England, Cambridge, MA, USA. Partially supported by a Microsoft Research New Faculty Fellowship. [email protected]
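For orientation, the two guarantees contrasted in the abstract can be written roughly as follows. This is a hedged sketch based on the abstract's description, with ε the differential privacy parameter, ν the inferential privacy parameter, i an individual, and a, b two possible values of that individual's data; the paper's exact definitions may quantify over outputs, priors, and individuals somewhat differently.

    % epsilon-differential privacy: for any two datasets x, x' differing in one
    % individual's entry and any output o of the mechanism M,
    \Pr[M(x) = o] \;\le\; e^{\varepsilon} \, \Pr[M(x') = o]

    % nu-inferential privacy (sketch): when the data X are drawn from a joint
    % prior known to the adversary, observing the output shifts the adversary's
    % odds about individual i's value by at most a factor e^{\nu}:
    \frac{\Pr[X_i = a \mid M(X) = o]}{\Pr[X_i = b \mid M(X) = o]}
      \;\le\; e^{\nu} \, \frac{\Pr[X_i = a]}{\Pr[X_i = b]}

When the entries of X are independent, the second guarantee follows from the first with ν = ε; the paper's results instead bound ν in terms of ε when the prior exhibits positive affiliation or weak correlations.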


Similar Articles

Differentially Private Local Electricity Markets

Privacy-preserving electricity markets have a key role in steering customers towards participation in local electricity markets by guaranteeing the protection of their sensitive information. Moreover, these markets make it possible to statically release and share the market outputs for social good. This paper aims to design a market for local energy communities by implementing Differential Privacy (DP)...


Differentially private instance-based noise mechanisms in practice

Differential privacy is a widely used privacy model today, whose privacy guarantees are obtained at the price of a random perturbation of the result. In some situations, basic differentially private mechanisms may add too much noise to reach a reasonable level of privacy. To address this shortcoming, several works have provided more technically involved mechanisms, using a new paradigm of differ...
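As a concrete reference point for the "price of a random perturbation" mentioned in this snippet, here is a minimal Python sketch of the standard Laplace mechanism (not code from the cited paper): noise with scale sensitivity/ε is added to the query answer, which is also the shape of the noisy-sum mechanism mentioned in the main abstract above.

    import numpy as np

    def laplace_mechanism(data, query, sensitivity, epsilon, rng=None):
        """Release query(data) with Laplace noise of scale sensitivity/epsilon.

        Standard epsilon-differentially private Laplace mechanism; sensitivity
        must upper-bound how much query(data) can change when a single
        individual's record changes.
        """
        rng = rng or np.random.default_rng()
        true_answer = query(data)
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return true_answer + noise

    # Example: a noisy sum of binary data (sensitivity 1), as in the simple
    # mechanism referred to in the main abstract.
    data = [0, 1, 1, 0, 1]
    noisy_sum = laplace_mechanism(data, query=sum, sensitivity=1.0, epsilon=0.5)

Smaller ε means larger noise, which is exactly the trade-off that the instance-based mechanisms discussed in this snippet aim to improve on.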


More Flexible Differential Privacy: The Application of Piecewise Mixture Distributions in Query Release

There is an increasing demand to make data “open” to third parties, as data sharing has great benefits in data-driven decision making. However, with a wide variety of sensitive data collected, protecting the privacy of individuals, communities, and organizations is an essential factor in making data “open”. The approaches currently adopted by industry in releasing private data are often ad hoc and p...


Featherweight PINQ

Differentially private mechanisms enjoy a variety of composition properties. Leveraging these, McSherry introduced PINQ (SIGMOD 2009), a system empowering non-experts to construct new differentially private analyses. PINQ is a LINQ-like API which provides automatic privacy guarantees for all programs which use it to mediate sensitive data manipulation. In this work we introduce featherweight P...
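For context on the composition properties mentioned here, the most basic one (sequential composition, a standard differential privacy fact and not specific to PINQ) can be stated as:

    % Sequential composition: releasing the outputs of two differentially
    % private computations on the same dataset degrades the guarantee additively.
    M_1 \text{ is } \varepsilon_1\text{-DP} \;\wedge\; M_2 \text{ is } \varepsilon_2\text{-DP}
      \;\Longrightarrow\; (M_1, M_2) \text{ is } (\varepsilon_1 + \varepsilon_2)\text{-DP}

It is this kind of accounting that lets a system like PINQ track a privacy budget across the operations of a larger program.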


(Nearly) Optimal Differentially Private Stochastic Multi-Arm Bandits

We study the problem of private stochastic multi-arm bandits. Our notion of privacy is the same as in some of the earlier works in the general area of private online learning [13, 17, 24]. We design algorithms that are i) differentially private, and ii) have regret guarantees that (almost) match the regret guarantees for the best non-private algorithms (e.g., upper confidence bound sampling and Tho...




Journal title:

Volume:   Issue:

Pages:

Publication year: 2017